
    Unleashing Digital Process Innovation with Process Mining: Designing a Training Concept with Action Design Research

    Process mining (PM) is an emerging trend across many industries. To exploit its potential for increased transparency and organizational efficiency, PM needs to be implemented successfully. Because of PM’s specific characteristics, knowledge about implementing other information systems cannot be transferred seamlessly. Applying an action design research (ADR) approach in a mixed team with the PM provider Celonis, we develop a training solution that facilitates PM implementation from a third-party implementation partner’s perspective. We first formulate the problem by investigating challenges in the implementation process. Next, we derive a training concept as an artifact, drawing on the theoretical foundation of IT implementation models. We then evaluate the artifact, reflect on it, and formalize the learning. The paper contributes to the PM knowledge base by identifying 38 implementation challenges, such as quantifying the value of PM, and by transferring those insights into practice through a prototype solution.

    A Method and Tool for Predictive Event-Driven Process Analytics

    Business value can be lost if a decision maker’s action distance to the observation of a business event is too high. So far, two classes of information systems that promise to assist decision makers have been discussed only independently of each other: business intelligence systems, which query historic business event data to prepare predictions of future process behavior, and real-time monitoring systems. This paper suggests using real-time data for predictions, following an event-driven approach. A predictive event-driven process analytics (edPA) method is presented that integrates aspects of business activity monitoring and process intelligence. Needs for procedure integration, metric quality, and the inclusion of actionable improvements are outlined. The method is implemented as a software prototype and evaluated.
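
    To illustrate the event-driven idea, the sketch below trains a predictor on historic traces and queries it the instant a new event arrives, rather than via a periodic BI query. All names and the simple mean-based model are hypothetical illustrations, not the paper’s edPA method.

```python
# Minimal sketch of event-driven prediction (hypothetical design, not the
# authors' edPA method): historic traces yield per-activity remaining-time
# statistics; each incoming event immediately triggers a prediction.

from collections import defaultdict
from statistics import mean

class RemainingTimePredictor:
    """Predicts remaining processing time from historic activity timestamps."""

    def __init__(self):
        # activity -> observed remaining times until case completion (minutes)
        self.history = defaultdict(list)

    def train(self, completed_traces):
        # completed_traces: list of [(activity, timestamp_in_minutes), ...]
        for trace in completed_traces:
            case_end = trace[-1][1]
            for activity, ts in trace:
                self.history[activity].append(case_end - ts)

    def on_event(self, activity):
        """Event-driven hook: called the moment an event is observed."""
        seen = self.history.get(activity)
        if not seen:
            return None  # no historic evidence for this activity yet
        return mean(seen)  # predicted remaining time from this point

predictor = RemainingTimePredictor()
predictor.train([
    [("receive_order", 0), ("check_credit", 30), ("ship", 90)],
    [("receive_order", 0), ("check_credit", 45), ("ship", 120)],
])
print(predictor.on_event("check_credit"))  # -> 67.5 minutes remaining
```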

    Theory-based Analyses of Interorganisational Standards for Self-organising, Adaptive Value Creation Networks

    Today, many enterprises find themselves forming new value creation networks or integrating into existing ones to strengthen their market position and to provide innovative solutions to their customers. Due to their high complexity, effective and efficient value creation networks rely on self-organising and adaptive structures and processes. Information flows among business partners and the coordination of these flows through cooperation activities are major design parameters of such networks. Interorganisational standards (IOS) seek to ease information infrastructure design by providing a referential frame. However, practitioners who must select specific standards, and thereby decide against others, so far lack sufficient theoretical guidance for this selection problem. This research informs the IOS selection problem by condensing insights from the bodies of knowledge of management cybernetics and coordination theory and by identifying initial requirements for a method that guides IOS choices.

    Predictive Business Process Monitoring with Context Information from Documents

    Predictive business process monitoring deals with predicting a process’s future behavior, or the value of process-related performance indicators, based on process event data. Researchers have proposed a variety of prototypical predictive business process monitoring techniques to help process participants perform business processes better. In practical settings, these techniques have a low predictive quality that is often close to random, so predictive business process monitoring applications are rare in practice. Including process-context data has been discussed as a way to improve predictive quality, but existing approaches have considered only structured data as context. In this paper, we argue that process-related unstructured documents are also a promising source of process-context data. Accordingly, this research-in-progress paper outlines a design-science research process for creating a predictive business process monitoring technique that uses context data from process-related documents to predict a process instance’s next activity more accurately.
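
    One plausible way to realize this idea, sketched below under assumed feature design (the paper does not specify its technique), is to fold text from attached documents into the same feature space as the activity prefix and train a standard next-activity classifier. The example data and names are invented for illustration; scikit-learn is required.

```python
# Minimal sketch (assumed feature design, not the authors' technique): the
# activity prefix of a running case is concatenated with text drawn from
# process-related documents, and a standard classifier predicts the next
# activity.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Training data: (activity prefix, document snippet) -> observed next activity
prefixes = [
    ("receive_claim check_policy", "water damage kitchen photos attached"),
    ("receive_claim check_policy", "routine windshield replacement invoice"),
    ("receive_claim", "suspected fraud inconsistent statements"),
]
next_activities = ["assess_damage", "approve_payment", "escalate_review"]

# Unstructured document text shares one bag-of-words space with the
# control-flow prefix, so the model can pick up context signals.
docs = [f"{prefix} {text}" for prefix, text in prefixes]

model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(docs, next_activities)

running_case = "receive_claim check_policy water damage basement photos"
print(model.predict([running_case])[0])  # likely "assess_damage"
```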

    Conformance checking: A state-of-the-art literature review

    Conformance checking is a set of process mining functions that compare process instances with a given process model. It identifies deviations between the process instances' actual behaviour ("as-is") and their modelled behaviour ("to-be"). It is currently gaining momentum, especially for analyzing compliance in organizations, e.g. by auditors. Researchers have proposed a variety of conformance checking techniques that are geared towards certain process model notations or specific applications such as process model evaluation. This article reviews conformance checking techniques described in 37 scholarly publications. It classifies the techniques along the dimensions "modelling language", "algorithm type", "quality metric", and "perspective" using a concept matrix, so that the techniques can be accessed more easily by practitioners and researchers. The matrix highlights the dimensions where extant research concentrates and where blind spots remain. For instance, process miners often use declarative process modelling languages, but their application in conformance checking is rare. Likewise, process mining can investigate process roles or process metrics such as duration, but conformance checking techniques focus narrowly on analyzing control-flow. Future research may construct techniques that support these neglected approaches to conformance checking.
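
    The as-is/to-be comparison can be made concrete with a toy example. The sketch below checks traces against a to-be model expressed as allowed directly-follows relations; this is a deliberate simplification of real conformance checking techniques such as token replay or alignments, which also handle concurrency and loops. Model and trace contents are invented.

```python
# Minimal sketch of control-flow conformance checking against a to-be model
# given as allowed directly-follows relations (a simplification of token
# replay / alignment techniques).

ALLOWED = {  # to-be model: activity -> set of permitted successors
    "start": {"register"},
    "register": {"check"},
    "check": {"approve", "reject"},
    "approve": {"archive"},
    "reject": {"archive"},
}

def deviations(trace):
    """Return (position, from_activity, to_activity) for each violation."""
    issues = []
    previous = "start"
    for i, activity in enumerate(trace):
        if activity not in ALLOWED.get(previous, set()):
            issues.append((i, previous, activity))
        previous = activity
    return issues

# As-is behaviour from the event log: "approve" occurs without "check".
print(deviations(["register", "approve", "archive"]))
# -> [(1, 'register', 'approve')]
```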

    Towards a Semantic Data Quality Management - Using Ontologies to Assess Master Data Quality in Retailing

    Since its inception, Information Systems has relied heavily on older, more established reference disciplines for much of its theory development and practical application. The relationship between the economic sciences and information quality has been the subject of much of the work recognized by the Nobel Prize in Economic Sciences. Beginning with Simon’s decision-making model, published before a discipline known as Information Systems existed, this paper reviews this relationship and the parallel development of information quality and computing capability from an Information Systems perspective, alongside the changing paradigms in economics recognized in the works of the Nobel laureates. From economic theories based on assumed knowledge, the paradigm is shifting to methods of empirical testing and experimentation. Organizations continue to make operational and strategic decisions. Additionally, information is now being aggregated, warehoused, mined, and analyzed to make a host of societal decisions and to understand economic behaviors through experimentation and empirical analysis.